Kernel PLS-SVC for Linear and Nonlinear Classification
Authors
Abstract
A new method for classification is proposed, based on kernel orthonormalized partial least squares (PLS) dimensionality reduction of the original data space followed by a support vector classifier. Unlike principal component analysis (PCA), which has previously served as a dimension-reduction step for discrimination problems, orthonormalized PLS is closely related to Fisher's approach to linear discrimination, or equivalently to canonical correlation analysis. For this reason orthonormalized PLS is preferable to PCA for discrimination. The good behavior of the proposed method is demonstrated on 13 different benchmark data sets and on the real-world problem of classifying finger-movement periods from non-movement periods based on electroencephalograms.
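As a rough illustration of the first stage, the leading orthonormalized-PLS direction for a binary problem can, in the linear case, be read off as the top eigenvector of Xc' Y Y' Xc. The sketch below is a simplified plain-NumPy analogue on toy data, with a simple threshold standing in for the support vector classifier the paper actually uses; it only shows how the projection concentrates the discriminative signal:

```python
import numpy as np

def onpls_direction(X, y):
    """Leading orthonormalized-PLS direction for binary labels.

    Simplified linear sketch: take the top eigenvector of
    Xc' Y Y' Xc, where Xc is the column-centred data and Y the
    centred label indicator.  The paper's kernel version does the
    analogous computation in a kernel-induced feature space.
    """
    Xc = X - X.mean(axis=0)
    yc = (y - y.mean()).reshape(-1, 1)
    S = Xc.T @ yc @ yc.T @ Xc           # rank-one between-class matrix
    vals, vecs = np.linalg.eigh(S)      # eigenvalues in ascending order
    return vecs[:, -1]                  # eigenvector of the largest one

# toy two-class data: only axis 0 carries class information
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 1, (50, 3)), rng.normal(2, 1, (50, 3))])
X[:, 1:] = rng.normal(0, 1, (100, 2))   # axes 1-2 are pure noise
y = np.array([0] * 50 + [1] * 50)

w = onpls_direction(X, y)
scores = (X - X.mean(axis=0)) @ w       # one-dimensional projection
pred = (scores > 0).astype(int)
acc = max((pred == y).mean(), 1.0 - (pred == y).mean())  # eigenvector sign is arbitrary
```

The recovered direction is dominated by the informative axis, and even a zero threshold on the projected scores separates the classes well.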
Similar Articles
Increasing the accuracy of the classification of diabetic patients in terms of functional limitation using linear and nonlinear combinations of biomarkers: Ramp AUC method
The area under the ROC curve (AUC) is a common index for evaluating the classification ability of biomarkers. In practice, a single biomarker has limited classification ability, so to improve classification performance we are interested in combining biomarkers both linearly and nonlinearly. In this study, while introducing various types of loss functions, the Ramp AUC method and some of...
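For reference, the empirical AUC such methods evaluate is the Mann-Whitney statistic, computable directly from the scores of the positive and negative samples. A minimal stdlib sketch (not the authors' Ramp AUC code):

```python
def auc(scores_pos, scores_neg):
    """Empirical AUC: the probability that a random positive scores
    above a random negative, counting ties as half a win
    (Mann-Whitney U normalised by n_pos * n_neg)."""
    wins = 0.0
    for sp in scores_pos:
        for sn in scores_neg:
            if sp > sn:
                wins += 1.0
            elif sp == sn:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

a = auc([0.9, 0.8, 0.4], [0.3, 0.5])   # 5 of 6 pairs ranked correctly
```

Here one positive (0.4) falls below one negative (0.5), so the AUC is 5/6.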
Linear and Nonlinear Multivariate Classification of Iranian Bottled Mineral Waters According to Their Elemental Content Determined by ICP-OES
Combinations of inductively coupled plasma-optical emission spectrometry (ICP-OES) with three classification algorithms, i.e., partial least squares discriminant analysis (PLS-DA), least squares support vector machines (LS-SVM), and soft independent modeling of class analogies (SIMCA), were explored for discriminating different brands of Iranian bottled mineral waters. ICP-OES was used for th...
متن کاملB - 439 Support Vector Machine Based on Conditional Value - at - Risk Minimization Akiko
A binary linear classification method, the CGS method, was recently proposed by Gotoh and Takeda. The classification model was developed by introducing a risk measure known as the conditional value-at-risk (β-CVaR). CVaR minimization over the margin distribution leads to the CGS problem, which is equivalent to the ν-SVC of Schölkopf et al. in the convex case and to the Extended ν-SVC of Perez-Cruz et al. in the nonconvex c...
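In its simplest empirical form, the β-CVaR underlying such models is the mean of the worst (1 − β) fraction of the losses. A minimal sketch, which ignores the fractional tail atom of the exact definition:

```python
import math

def cvar(losses, beta):
    """Simplified empirical beta-CVaR: average of the worst
    ceil((1 - beta) * n) losses.  beta = 0 gives the plain mean;
    beta -> 1 approaches the maximum loss."""
    xs = sorted(losses, reverse=True)
    k = max(1, math.ceil((1.0 - beta) * len(xs)))
    return sum(xs[:k]) / k
```

For losses [1, 2, 3, 4], beta = 0.5 averages the worst half {4, 3}, giving 3.5, while beta = 0 recovers the ordinary mean 2.5.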
Kernel logistic PLS: A tool for supervised nonlinear dimensionality reduction and binary classification
Kernel logistic PLS (KL-PLS) is a new tool for supervised nonlinear dimensionality reduction and binary classification. The principles of KL-PLS are based on both PLS latent-variable construction and learning with kernels. The KL-PLS algorithm can be seen as a supervised dimensionality reduction (complexity-control step) followed by a classification based on logistic regression. The algorithm...
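The final classification step described here is ordinary logistic regression on the latent variables. The sketch below fits it to a single latent score by gradient descent on toy data; it is an illustration of that last step, not the authors' implementation:

```python
import numpy as np

def fit_logistic_1d(t, y, lr=0.5, n_iter=500):
    """Logistic regression on a single latent score t by batch
    gradient descent on the log-loss."""
    w, b = 0.0, 0.0
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-(w * t + b)))   # predicted probabilities
        w -= lr * np.mean((p - y) * t)           # gradient w.r.t. slope
        b -= lr * np.mean(p - y)                 # gradient w.r.t. intercept
    return w, b

# toy latent scores from a (hypothetical) PLS step, with binary labels
t = np.array([-2.0, -1.5, -1.0, 1.0, 1.5, 2.0])
y = np.array([0, 0, 0, 1, 1, 1])

w, b = fit_logistic_1d(t, y)
p = 1.0 / (1.0 + np.exp(-(w * t + b)))
pred = (p > 0.5).astype(int)
```

On this separable toy data the fitted slope is positive and all six points are classified correctly.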
Constructing Orthogonal Latent Features for Arbitrary Loss
A boosting framework for constructing orthogonal features targeted to a given loss function is developed. Combined with techniques from spectral methods such as PCA and PLS, an orthogonal boosting algorithm for linear hypotheses is used to efficiently construct orthogonal latent features selected to optimize the given loss function. The method is generalized to construct orthogonal nonlinear fe...
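The orthogonality constraint amounts to deflating each candidate feature against the features already accepted. Assuming the previously accepted feature vectors are mutually orthogonal, a single Gram-Schmidt pass suffices, as this small sketch shows:

```python
import numpy as np

def orthogonalize(f_new, F_prev):
    """Remove from f_new its projection onto each previously accepted
    feature (the columns of F_prev).  A single pass is exact only when
    those columns are mutually orthogonal."""
    g = f_new.astype(float).copy()
    for j in range(F_prev.shape[1]):
        u = F_prev[:, j]
        g -= (g @ u) / (u @ u) * u      # subtract projection onto u
    return g

# two orthonormal features already accepted, one new candidate
F = np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 0.0]])
f = np.array([1.0, 2.0, 3.0])
g = orthogonalize(f, F)                 # components along F are removed
```

The deflated candidate keeps only the component outside the span of the accepted features, so it is orthogonal to each of them by construction.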